Monte Carlo technique - translation to Russian

A broad class of computational algorithms using random sampling to obtain numerical results
Monte Carlo method; Monte Carlo methods; Monte-Carlo method; Monte-Carlo methods; Monte Carlo simulation; Monte Carlo simulations; Monte-Carlo simulation; Monte Carlo simulation technique; Monte Carlo simulation techniques; Monte Carlo analysis; Monte Carlo sampling; Monte Carlo model; Monte Carlo software; Monte Carlo calculation; Monte Carlo run; Monte Carlo experiment; Monte Carlo experiments; Applications of Monte Carlo methods
  • Errors reduce by a factor of $1/\sqrt{N}$
  • Monte Carlo integration works by comparing random points with the value of the function
  • Monte Carlo method applied to approximating the value of π (a minimal sketch of this experiment follows below)
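The last caption describes the textbook experiment: draw random points in the unit square and count the fraction that falls inside the quarter circle of radius 1; four times that fraction approximates π, and the error shrinks roughly like $1/\sqrt{N}$. A minimal Python sketch of that experiment (the name estimate_pi and the sample counts are illustrative choices, not taken from this page):

```python
# Illustrative sketch: estimate pi by sampling uniform points in the unit
# square and counting how many fall inside the quarter circle of radius 1.
# The standard error of the estimate shrinks roughly like 1/sqrt(N).
import math
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:        # point lies inside the quarter circle
            inside += 1
    return 4.0 * inside / n_samples     # area ratio times 4 approximates pi

if __name__ == "__main__":
    for n in (1_000, 100_000, 10_000_000):
        est = estimate_pi(n)
        print(f"N={n:>10}: pi ~= {est:.5f}  (error {abs(est - math.pi):.5f})")
```

Increasing N by a factor of 100 should cut the typical error by roughly a factor of 10, which is the $1/\sqrt{N}$ behaviour the first caption refers to.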

Monte Carlo technique      
метод статистических испытаний (method of statistical trials), метод Монте-Карло (Monte Carlo method)
Monte Carlo method         

[ˌmɔnti'kɑ:ləu 'meθəd]

общая лексика (general vocabulary)

метод Монте-Карло (Monte Carlo method)

метод статистических испытаний (method of statistical trials)

Monte Carlo calculation         
расчёт методом Монте-Карло (calculation by the Monte Carlo method)

Definition

МОНТЕ-КАРЛО
(Monte Carlo), a town in Monaco on the Mediterranean Sea, with about 12 thousand inhabitants. A climatic resort; casino; a centre of tourism and banking. Ship repair. Museum of Fine Arts.

Wikipedia

Monte Carlo method

Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches. Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
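As a concrete illustration of the numerical-integration use case named above, plain Monte Carlo integration draws points uniformly over the domain, averages the integrand there, and scales by the domain size. The sketch below is a toy under stated assumptions (the name mc_integrate, the integrand, and the sample count are illustrative, not part of the article):

```python
# Illustrative sketch: plain Monte Carlo integration of f over [a, b].
# The integral is approximated by (b - a) times the sample mean of f at
# uniformly drawn points.
import math
import random

def mc_integrate(f, a: float, b: float, n_samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.uniform(a, b)            # uniform draw on [a, b]
        total += f(x)
    return (b - a) * total / n_samples   # (b - a) * E[f(X)], X ~ Uniform(a, b)

if __name__ == "__main__":
    # Integral of exp(-x^2) over [0, 1]; the exact value is sqrt(pi)/2 * erf(1).
    approx = mc_integrate(lambda x: math.exp(-x * x), 0.0, 1.0, 200_000)
    exact = math.sqrt(math.pi) / 2 * math.erf(1.0)
    print(f"Monte Carlo: {approx:.5f}   exact: {exact:.5f}")
```

The same recipe extends to high-dimensional integrals, which is where it tends to beat grid-based quadrature.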

In physics-related problems, Monte Carlo methods are useful for simulating systems with many coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and cellular structures (see cellular Potts model, interacting particle systems, McKean–Vlasov processes, kinetic models of gases).
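A classic toy of the "many coupled degrees of freedom" kind listed above is the 2D Ising model sampled with single-spin-flip Metropolis moves. The sketch below is illustrative only; the lattice size, inverse temperature, and function names are assumptions for the demo, not from the article:

```python
# Illustrative sketch: Metropolis Monte Carlo sweeps on a small 2D Ising model
# with periodic boundary conditions (a toy system of many coupled spins).
import math
import random

def ising_metropolis(size: int = 20, beta: float = 0.6, n_sweeps: int = 400, seed: int = 0):
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(size)] for _ in range(size)]
    for _ in range(n_sweeps):
        for _ in range(size * size):                      # one sweep = size^2 attempted flips
            i, j = rng.randrange(size), rng.randrange(size)
            # Sum of the four nearest neighbours (periodic boundaries).
            nb = (spins[(i + 1) % size][j] + spins[(i - 1) % size][j]
                  + spins[i][(j + 1) % size] + spins[i][(j - 1) % size])
            d_energy = 2.0 * spins[i][j] * nb             # energy change if this spin flips
            if d_energy <= 0 or rng.random() < math.exp(-beta * d_energy):
                spins[i][j] = -spins[i][j]                # accept the flip
    return spins

if __name__ == "__main__":
    lattice = ising_metropolis()
    size = len(lattice)
    magnetisation = sum(map(sum, lattice)) / (size * size)
    print(f"magnetisation per spin after the sweeps: {magnetisation:+.3f}")
```

Below the critical temperature (beta above roughly 0.44 for this model) the lattice typically develops a large net magnetisation, which the final print makes visible.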

Other examples include modeling phenomena with significant uncertainty in inputs such as the calculation of risk in business and, in mathematics, evaluation of multidimensional definite integrals with complicated boundary conditions. In application to systems engineering problems (space, oil exploration, aircraft design, etc.), Monte Carlo–based predictions of failure, cost overruns and schedule overruns are routinely better than human intuition or alternative "soft" methods.

In principle, Monte Carlo methods can be used to solve any problem having a probabilistic interpretation. By the law of large numbers, integrals described by the expected value of some random variable can be approximated by taking the empirical mean (a.k.a. the 'sample mean') of independent samples of the variable. When the probability distribution of the variable is parameterized, mathematicians often use a Markov chain Monte Carlo (MCMC) sampler. The central idea is to design a judicious Markov chain model with a prescribed stationary probability distribution. That is, in the limit, the samples being generated by the MCMC method will be samples from the desired (target) distribution. By the ergodic theorem, the stationary distribution is approximated by the empirical measures of the random states of the MCMC sampler.
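The paragraph above compresses two ideas: expectations can be approximated by sample means (law of large numbers), and when direct sampling is hard, one can run a Markov chain whose stationary distribution is the target and average along the chain (ergodic theorem). A minimal random-walk Metropolis sketch, with an illustrative target (a standard normal, up to a constant) and an arbitrary step size:

```python
# Illustrative sketch: random-walk Metropolis, one common MCMC scheme.
# The chain's stationary distribution is the (unnormalised) target density,
# so long-run ergodic averages approximate expectations under the target.
import math
import random

def metropolis(log_target, x0: float, n_steps: int, step: float = 1.0, seed: int = 0):
    rng = random.Random(seed)
    x, logp = x0, log_target(x0)
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)                       # symmetric proposal
        logp_new = log_target(proposal)
        if rng.random() < math.exp(min(0.0, logp_new - logp)):    # Metropolis accept/reject
            x, logp = proposal, logp_new
        samples.append(x)
    return samples

if __name__ == "__main__":
    # Target: standard normal up to a constant, so E[X^2] should come out near 1.
    chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=100_000)
    tail = chain[1_000:]                                          # drop a short burn-in
    second_moment = sum(x * x for x in tail) / len(tail)
    print(f"ergodic estimate of E[X^2]: {second_moment:.3f} (exact value 1.0)")
```

Real MCMC practice adds proposal tuning and convergence diagnostics, but the accept/reject core is this small.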

In other problems, the objective is generating draws from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability distributions can always be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states (see McKean–Vlasov processes, nonlinear filtering equation). In other instances we are given a flow of probability distributions with an increasing level of sampling complexity (path spaces models with an increasing time horizon, Boltzmann–Gibbs measures associated with decreasing temperature parameters, and many others). These models can also be seen as the evolution of the law of the random states of a nonlinear Markov chain. A natural way to simulate these sophisticated nonlinear Markov processes is to sample multiple copies of the process, replacing in the evolution equation the unknown distributions of the random states by the sampled empirical measures. In contrast with traditional Monte Carlo and MCMC methodologies, these mean-field particle techniques rely on sequential interacting samples. The terminology mean field reflects the fact that each of the samples (a.k.a. particles, individuals, walkers, agents, creatures, or phenotypes) interacts with the empirical measures of the process. When the size of the system tends to infinity, these random empirical measures converge to the deterministic distribution of the random states of the nonlinear Markov chain, so that the statistical interaction between particles vanishes.
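To make the mean-field particle idea concrete, the toy below simulates N copies of a simple McKean–Vlasov-type diffusion, dX = -(X - E[X]) dt + σ dW, and replaces the unknown mean E[X_t] appearing in the drift with the particles' empirical mean at every time step. All parameter values and names here are assumptions chosen only for illustration:

```python
# Illustrative sketch of a mean-field particle approximation: the law of the
# process enters its own drift, and that unknown law is replaced by the
# empirical measure (here just the empirical mean) of N interacting particles.
import math
import random

def simulate_mean_field(n_particles: int = 2_000, n_steps: int = 1_000,
                        dt: float = 0.02, sigma: float = 1.0, seed: int = 0):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 2.0) for _ in range(n_particles)]   # spread-out start
    for _ in range(n_steps):
        empirical_mean = sum(particles) / n_particles               # stands in for E[X_t]
        particles = [
            x - (x - empirical_mean) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            for x in particles
        ]
    return particles

if __name__ == "__main__":
    xs = simulate_mean_field()
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    print(f"empirical mean {mean:+.3f}, empirical variance {var:.3f} "
          f"(variance should settle near sigma^2/2 = 0.5)")
```

As the article notes, when n_particles grows the empirical measure converges to the deterministic law of the nonlinear process and the particles effectively decouple.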

Despite its conceptual and algorithmic simplicity, the computational cost associated with a Monte Carlo simulation can be staggeringly high. In general the method requires many samples to get a good approximation, which may incur an arbitrarily large total runtime if the processing time of a single sample is high. Although this is a severe limitation in very complex problems, the embarrassingly parallel nature of the algorithm allows this large cost to be reduced (perhaps to a feasible level) through parallel computing strategies in local processors, clusters, cloud computing, GPU, FPGA, etc.
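Because every sample (or block of samples) is independent, the cost splits cleanly: give each worker its own chunk of the budget and its own seed, then combine the partial estimates. A hedged sketch using only Python's standard library (the worker count, chunk size, and function names are arbitrary choices for the demo):

```python
# Illustrative sketch of the embarrassingly parallel structure: independent,
# separately seeded chunks of a Monte Carlo pi estimate run in worker
# processes, and the equal-sized partial estimates are simply averaged.
import math
import random
from concurrent.futures import ProcessPoolExecutor

def pi_chunk(args):
    n_samples, seed = args
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n_samples

if __name__ == "__main__":
    n_workers, n_per_worker = 4, 500_000
    jobs = [(n_per_worker, seed) for seed in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        partial_estimates = list(pool.map(pi_chunk, jobs))
    estimate = sum(partial_estimates) / n_workers      # equal chunks: plain average
    print(f"parallel pi estimate: {estimate:.5f} (error {abs(estimate - math.pi):.5f})")
```

The same pattern scales out to clusters or GPUs; the parts that need care are independent seeding and the combination step.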
